
OV5647

This document introduces the camera development and adaptation process, using the OV5647 as an example.

Camera Architecture Overview

Common Terms

  • ISP: Image Signal Processor, used for image denoising, white balance, color correction, etc.
  • Bayer: Color filter array pattern used by the camera sensor to capture color images; raw output follows a Bayer pattern such as BGGR, GBRG, GRBG, or RGGB.
  • CSI: Camera Serial Interface, used for data transmission between the camera and the processor.
  • V4L2: Linux kernel framework for video capture and output.
  • libargus: NVIDIA's advanced camera control library.
  • Camera Core Library: Provides control and processing functions between applications and the kernel-mode V4L2 drivers.
  • GStreamer: Open-source multimedia framework for audio and video processing.
  • Host1x: Host interface controller in the NVIDIA Tegra SoC.
  • Aperture: Control for adjusting the lens aperture size.
  • UVC: USB Video Class, used for USB camera drivers.
  • VI: Video Input module, which receives and processes video data.

Camera SW Architecture

The NVIDIA platform provides two camera driver architectures: the Camera Core Library Interface and the Direct V4L2 Interface. The main difference is that the former provides access to the ISP on the Jetson platform and to advanced image processing functions, and exposes the corresponding interfaces to user space through libargus. libargus provides simplified APIs for accessing the camera and the ISP on the Jetson platform; the nvgstcapture-1.0 tool used later is also based on this library. Since we need NVIDIA's ISP functions later, this guide develops and adapts the driver against the Camera Core Library Interface framework.

OV5647 Driver Adaptation Process

Prerequisites

  • Obtain the camera I2C address, confirm the wiring sequence, power and reset pins.

  • Obtain the camera power-on sequence, clock, supported resolutions, frame rates, and camera filter array (BGGR, GBRG, GRBG, RGGB).

  • Confirm the camera connection interface and map it to the appropriate tegra_sinterface; for example, on Jetson Orin NX, cam1 corresponds to serial_a and cam0 corresponds to serial_c.

This adaptation is based on the Jetson Orin NX platform, and the hardware connection interface is cam0.

Device Tree Configuration

  • The OV5647 device tree is loaded as an overlay (dtbo), so create a DTS file in the kernel source tree at the following path:

    Linux_for_Tegra/source/hardware/nvidia/t23x/nv-public/overlay/tegra234-p3767-camera-p3768-ov5647.dts
  • Create the tegra-capture-vi device tree node

    tegra-capture-vi {
        num-channels = <1>;
        ports {
            #address-cells = <1>;
            #size-cells = <0>;
            vi_port1: port@1 {
                reg = <0>;
                rbpcv2_ov5647_vi_in1: endpoint {
                    port-index = <2>;
                    bus-width = <2>;
                    remote-endpoint = <&rbpcv2_ov5647_csi_out1>;
                };
            };
        };
    };
  • Create the rbpcv2_ov5647_c@36 device under the I2C node and fill in the I2C device address, clock, and sensor modes (resolution, frame rate, etc.) according to the datasheet

    rbpcv2_ov5647_c@36 {
        reset-gpios = <&gpio CAM0_RST GPIO_ACTIVE_HIGH>;
        pwdn-gpios = <&gpio CAM1_PWDN GPIO_ACTIVE_HIGH>;
        compatible = "ovti,ov5647";
        /* I2C device address */
        reg = <0x36>;
        /* V4L2 device node location */
        devnode = "video0";
        /* Physical dimensions of sensor */
        physical_w = "3.670";
        physical_h = "2.740";
        sensor_model = "ov5647";
        use_sensor_mode_id = "true";

        clocks = <&bpmp TEGRA234_CLK_EXTPERIPH2>,
                 <&bpmp TEGRA234_CLK_PLLP_OUT0>;
        clock-names = "extperiph2", "pllp_grtba";
        mclk = "extperiph2";
        clock-frequency = <24000000>;

        mode0 { /* ov5647_MODE0_1920x1080_30FPS */
            mclk_khz = "25000";
            num_lanes = "2";
            // lane_polarity = "4";
            tegra_sinterface = "serial_c";
            phy_mode = "DPHY";
            discontinuous_clk = "yes";
            dpcm_enable = "false";
            cil_settletime = "0";
            active_w = "1920";
            active_h = "1080";
            mode_type = "bayer";
            pixel_phase = "bggr";
            csi_pixel_bit_depth = "10";
            readout_orientation = "90";
            line_length = "2416";
            inherent_gain = "1";
            mclk_multiplier = "3.6669";
            pix_clk_hz = "81666663";
            gain_factor = "16";
            framerate_factor = "1000000";
            exposure_factor = "1000000";
            min_gain_val = "16";
            max_gain_val = "128";
            step_gain_val = "1";
            default_gain = "16";
            min_hdr_ratio = "1";
            max_hdr_ratio = "1";
            min_framerate = "30000000"; /* 30.0 fps */
            max_framerate = "30000000"; /* 30.0 fps */
            step_framerate = "1";
            default_framerate = "30000000"; /* 30.0 fps */
            min_exp_time = "60"; /* us */
            max_exp_time = "30000"; /* us */
            step_exp_time = "1";
            default_exp_time = "10000"; /* us */
            embedded_metadata_height = "0";
        };

        ........................

        ports {
            #address-cells = <1>;
            #size-cells = <0>;
            port@0 {
                reg = <0>;
                rbpcv2_ov5647_out1: endpoint {
                    status = "okay";
                    port-index = <1>;
                    bus-width = <2>;
                    remote-endpoint = <&rbpcv2_ov5647_csi_in1>;
                };
            };
        };

    };
  • Configure the port bindings to establish the video stream connections.

    • The video data stream originates from the OV5647 camera, is received by the CSI module, which decodes, deserializes, and pre-processes it, and is then passed to the VI module for further processing and formatting before being written to system memory or handed to other processing units.

    • Connect the three modules (sensor, NVCSI, VI) through remote-endpoint properties.

      tegra-capture-vi {
          num-channels = <1>;
          ports {
              #address-cells = <1>;
              #size-cells = <0>;
              vi_port1: port@1 {
                  reg = <0>;
                  rbpcv2_ov5647_vi_in1: endpoint {
                      port-index = <2>;
                      bus-width = <2>;
                      remote-endpoint = <&rbpcv2_ov5647_csi_out1>;
                  };
              };
          };
      };

      host1x@13e00000 {
          nvcsi@15a00000 {
              num-channels = <1>;
              #address-cells = <1>;
              #size-cells = <0>;
              csi_chan1: channel@1 {
                  reg = <0>;
                  ports {
                      #address-cells = <1>;
                      #size-cells = <0>;
                      csi_chan1_port0: port@0 {
                          reg = <0>;
                          rbpcv2_ov5647_csi_in1: endpoint@2 {
                              port-index = <2>;
                              bus-width = <2>;
                              remote-endpoint = <&rbpcv2_ov5647_out1>;
                          };
                      };
                      csi_chan1_port1: port@1 {
                          reg = <1>;
                          rbpcv2_ov5647_csi_out1: endpoint@3 {
                              remote-endpoint = <&rbpcv2_ov5647_vi_in1>;
                          };
                      };
                  };
              };
          };
      };
  • Configure the Makefile: edit /home/xuys8233/work/Linux_for_Tegra/source/hardware/nvidia/t23x/nv-public/overlay/Makefile and add the following line:

    dtbo-y += tegra234-p3767-camera-p3768-ov5647.dtbo
  • Compile to generate dtbo

    make dtbs

Driver Configuration

  • Add the driver file /home/xuys8233/work/Linux_for_Tegra/source/nvidia-oot/drivers/media/i2c/ov5647.c. This driver can be adapted from nv_ov5693.c. The key points are as follows:

  • Register the camera driver through the NVIDIA-provided tegracam_device_register function. The following three structures need to be filled (a condensed probe sketch follows the three items below):

    • tc_dev->sensor_ops: this structure contains the sensor operation callbacks, such as power on, power off, register read/write, set mode, and start/stop streaming. These functions are called when the sensor is used through NVIDIA's libargus API. Here is an example of this structure:

      static struct tegracam_sensor_ops ov5647_sensor_ops = {
          .power_on = ov5647_power_on,
          .power_off = ov5647_power_off,
          .write_reg = ov5647_write_reg,
          .read_reg = ov5647_read_reg,
          .set_mode = ov5647_set_mode,
          .start_streaming = ov5647_start_streaming,
          .stop_streaming = ov5647_stop_streaming,
          .sensor_init = ov5647_sensor_init,
      };

      The specific implementation of these functions needs to be written according to the OV5647 datasheet and related documents.
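      As a minimal sketch (modeled on nv_ov5693.c; the regmap is assumed to be configured for 16-bit register addresses and 8-bit values via tc_dev->dev_regmap_config), the register access helpers might look like this:

      #include <linux/regmap.h>
      #include <media/camera_common.h>

      static int ov5647_read_reg(struct camera_common_data *s_data,
                                 u16 addr, u8 *val)
      {
          u32 reg_val = 0;
          int err;

          /* s_data->regmap is created by the tegracam framework */
          err = regmap_read(s_data->regmap, addr, &reg_val);
          *val = reg_val & 0xff;

          return err;
      }

      static int ov5647_write_reg(struct camera_common_data *s_data,
                                  u16 addr, u8 val)
      {
          int err;

          err = regmap_write(s_data->regmap, addr, val);
          if (err)
              dev_err(s_data->dev, "%s: i2c write failed, 0x%x = 0x%x\n",
                      __func__, addr, val);

          return err;
      }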

    • tc_dev->v4l2sd_internal_ops: this structure mainly implements the open function. If you need to use v4l2-ctl and related tools from user space, you also need to provide an s_stream callback (.s_stream = ov5647_s_stream). Here is an example:

      static struct v4l2_subdev_internal_ops ov5647_subdev_internal_ops = {
          .open = ov5647_open,
      };
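      A minimal open implementation (following the nv_ov5693.c pattern; nothing sensor-specific is required here, since power and mode setup go through sensor_ops) could be:

      #include <media/v4l2-subdev.h>

      static int ov5647_open(struct v4l2_subdev *sd, struct v4l2_subdev_fh *fh)
      {
          struct i2c_client *client = v4l2_get_subdevdata(sd);

          /* Called when user space opens the subdevice node */
          dev_dbg(&client->dev, "%s:\n", __func__);

          return 0;
      }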
    • tc_dev->tcctrl_ops: this structure contains the camera control callbacks, such as setting gain, exposure, and frame rate. Here is an example:

      static struct tegracam_ctrl_ops ov5647_ctrl_ops = {
          .numctrls = ARRAY_SIZE(ctrl_cid_list),
          .ctrl_cid_list = ctrl_cid_list,
          .set_gain = ov5647_set_gain,
          .set_exposure = ov5647_set_exposure,
          .set_frame_rate = ov5647_set_frame_rate,
          .set_group_hold = ov5647_set_group_hold,
      };
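    As a reference, a condensed probe sequence wiring these three structures into the framework might look like the sketch below (modeled on nv_ov5693.c; the struct ov5647 private data and ov5647_regmap_config are assumptions, and error handling is abbreviated):

      #include <media/tegracam_core.h>

      /* Hypothetical per-device private data, mirroring the nv_ov5693.c layout */
      struct ov5647 {
          struct i2c_client          *i2c_client;
          struct v4l2_subdev         *subdev;
          struct camera_common_data  *s_data;
          struct tegracam_device     *tc_dev;
      };

      /* i2c probe signature varies by kernel version; single-argument form shown */
      static int ov5647_probe(struct i2c_client *client)
      {
          struct device *dev = &client->dev;
          struct tegracam_device *tc_dev;
          struct ov5647 *priv;
          int err;

          priv = devm_kzalloc(dev, sizeof(*priv), GFP_KERNEL);
          tc_dev = devm_kzalloc(dev, sizeof(*tc_dev), GFP_KERNEL);
          if (!priv || !tc_dev)
              return -ENOMEM;

          tc_dev->client = client;
          tc_dev->dev = dev;
          strncpy(tc_dev->name, "ov5647", sizeof(tc_dev->name));
          tc_dev->dev_regmap_config = &ov5647_regmap_config;  /* 16-bit reg, 8-bit val */
          tc_dev->sensor_ops = &ov5647_sensor_ops;
          tc_dev->v4l2sd_internal_ops = &ov5647_subdev_internal_ops;
          tc_dev->tcctrl_ops = &ov5647_ctrl_ops;

          /* Register the sensor with the tegracam framework */
          err = tegracam_device_register(tc_dev);
          if (err)
              return err;

          priv->tc_dev = tc_dev;
          priv->s_data = tc_dev->s_data;
          priv->subdev = &tc_dev->s_data->subdev;
          tegracam_set_privdata(tc_dev, (void *)priv);

          /* Expose the subdev and V4L2 controls to user space */
          return tegracam_v4l2subdev_register(tc_dev, true);
      }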
  • Add the header file /home/xuys8233/work/Linux_for_Tegra/source/nvidia-oot/drivers/media/i2c/ov5647_modes_tbls.h, which mainly contains the register tables for the sensor resolutions, exposure settings, etc. Two notes, followed by a layout sketch:

    • These register tables are hooked into the framework through the tc_dev->sensor_ops structure.

    • Then define a register table for each resolution, plus tables such as start_stream and stop_stream. Functions such as set_mode and set_gain write the register values from these tables to the sensor. (Note: the order of the resolution tables must match the mode order in the device tree.)
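    A rough sketch of the header layout is shown below (modeled on nv_ov5693_mode_tbls.h; the register/value pairs are placeholders and must be taken from the OV5647 datasheet or a reference driver, not from this sketch):

      #include <media/camera_common.h>   /* provides struct reg_8 {u16 addr; u8 val;} */

      #define OV5647_TABLE_WAIT_MS   0
      #define OV5647_TABLE_END       1

      #define ov5647_reg struct reg_8

      /* Stream on/off tables (0x0100 is the OmniVision mode-select register) */
      static const ov5647_reg ov5647_start_stream[] = {
          {0x0100, 0x01},
          {OV5647_TABLE_END, 0x00}
      };

      static const ov5647_reg ov5647_stop_stream[] = {
          {0x0100, 0x00},
          {OV5647_TABLE_END, 0x00}
      };

      /* One table per resolution; values come from the datasheet/reference driver */
      static const ov5647_reg ov5647_mode_1920x1080_30fps[] = {
          /* ... register/value pairs ... */
          {OV5647_TABLE_END, 0x00}
      };

      /* The mode order here must match mode0, mode1, ... in the device tree */
      enum {
          OV5647_MODE_1920X1080_30FPS,
          OV5647_MODE_START_STREAM,
          OV5647_MODE_STOP_STREAM,
      };

      static const ov5647_reg *mode_table[] = {
          [OV5647_MODE_1920X1080_30FPS] = ov5647_mode_1920x1080_30fps,
          [OV5647_MODE_START_STREAM]    = ov5647_start_stream,
          [OV5647_MODE_STOP_STREAM]     = ov5647_stop_stream,
      };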

  • Configure the Makefile: modify /home/xuys8233/work/Linux_for_Tegra/source/nvidia-oot/drivers/media/i2c/Makefile and add:

    obj-m += ov5647.o
  • Compile the module to generate ov5647.ko

    make modules

System Adaptation

  • Boot the device, then copy the compiled dtbo and ov5647.ko to the AIBOX via scp.

    cd Linux_for_Tegra/source

    scp ./kernel-devicetree/generic-dts/dtbs/tegra234-p3767-camera-p3768-ov5647.dtbo milesight@192.168.60.3:/home/milesight
    # adjust the .ko path to wherever make modules placed the built module
    scp ./nvidia-oot/drivers/media/i2c/ov5647.ko milesight@192.168.60.3:/home/milesight
  • Copy ov5647.ko into the modules directory and run depmod so the driver can be loaded.

    sudo cp ./ov5647.ko /usr/lib/modules/5.15.148-tegra/updates/drivers/media/i2c/

    sudo depmod
  • Copy the dtbo to the /boot directory and enable it using the Jetson-IO config-by-hardware.py tool.

  • After completing these steps, restart the development board.

Debugging Ideas

  • Confirm whether the dtbo is loaded successfully

    • Execute cat /boot/extlinux/extlinux.conf to confirm the dtbo is referenced there.
    • Execute cat /proc/device-tree/tegra-camera-platform/modules/module1/drivernode0/sysfs-device-tree to confirm the dtbo has actually taken effect. (A dtbo can be listed in extlinux.conf without being applied; errors in the dtbo can silently prevent it from taking effect, so this check is necessary.)
  • Confirm whether the driver is loaded successfully

    • Execute lsmod | grep ov5647 to confirm the ov5647 module is loaded.
    • Execute sudo dmesg | grep ov5647 to check for errors during driver loading.
    • Use sudo i2cdetect -y -r 9 to confirm the sensor responds at the expected I2C address (an address already claimed by a driver is shown as UU).
  • After confirming there are no issues, run nvgstcapture-1.0 to exercise the sensor.

    • Confirm the listed resolutions match the modes in the device tree.
    • Check for other error messages; if nvbuf_utils errors appear, check dmesg for the corresponding errors.
  • Use sudo media-ctl -p -d /dev/media0 to inspect the media graph and check the video stream links.

References

Sensor Software Driver Programming

ov5647_driver/ov5647.c at main · mkm684/ov5647_driver